skin sensor
Skin-Machine Interface with Multimodal Contact Motion Classifier
Alberto Confente, Takanori Jin, Taisuke Kobayashi, Julio Rogelio Guadarrama-Olvera, Gordon Cheng
This paper proposes a novel framework that uses skin sensors as a new operation interface for complex robots. The skin sensors employed in this study can measure multimodal tactile information at multiple contact points. The time-series data from these sensors are expected to enable classification of the diverse contact motions performed by an operator. By mapping the classification results to robot motion primitives, a wide range of robot motions can be generated simply by changing how the skin sensors are touched. This paper focuses on a learning-based contact motion classifier built on recurrent neural networks, a pivotal component of this framework, and clarifies the software-hardware design conditions it requires. First, multimodal sensing and its comprehensive encoding significantly improve classification accuracy and learning stability; feeding all modalities to the classifier simultaneously proves effective. Second, the skin sensors must be mounted on a flexible, compliant support so that their three-axis accelerometers are activated. These accelerometers measure horizontal tactile information, strengthening the correlation with the other modalities, and they also absorb the noise generated by the robot's own movements during deployment. With these findings, the accuracy of the developed classifier exceeded 95%, enabling the dual-arm mobile manipulator to execute a diverse range of tasks via the Skin-Machine Interface. https://youtu.be/UjUXT4Z4BC8
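The pipeline the abstract describes — multimodal tactile time series fed to a recurrent classifier, whose predicted class is mapped to a motion primitive — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the modality count, contact-motion classes, primitive names, and the single-layer Elman RNN with untrained random weights are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Assumed setup (not from the paper): four tactile modalities per timestep,
# three contact-motion classes, and a hypothetical class-to-primitive mapping.
N_MODALITIES = 4                      # e.g. force, 3-axis acceleration, ...
HIDDEN = 8
CLASSES = ["tap", "stroke", "press"]
PRIMITIVES = {"tap": "open_gripper",
              "stroke": "move_arm_forward",
              "press": "stop"}

def rand_matrix(rows, cols):
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)]
            for _ in range(rows)]

W_in = rand_matrix(HIDDEN, N_MODALITIES)   # input-to-hidden weights
W_h = rand_matrix(HIDDEN, HIDDEN)          # recurrent hidden-to-hidden weights
W_out = rand_matrix(len(CLASSES), HIDDEN)  # hidden-to-output weights

def rnn_classify(sequence):
    """sequence: list of timesteps, each a list of N_MODALITIES readings.

    All modalities are encoded jointly at every timestep, mirroring the
    paper's finding that using all modalities as simultaneous inputs helps.
    """
    h = [0.0] * HIDDEN
    for x in sequence:
        h = [math.tanh(sum(W_in[i][j] * x[j] for j in range(N_MODALITIES))
                       + sum(W_h[i][k] * h[k] for k in range(HIDDEN)))
             for i in range(HIDDEN)]
    logits = [sum(W_out[c][k] * h[k] for k in range(HIDDEN))
              for c in range(len(CLASSES))]
    m = max(logits)                        # stabilized softmax
    exps = [math.exp(v - m) for v in logits]
    probs = [e / sum(exps) for e in exps]
    return CLASSES[probs.index(max(probs))]

# Usage: a fake 10-step sensor reading. With untrained weights the predicted
# class is arbitrary; this only demonstrates the data flow to a primitive.
seq = [[random.uniform(0.0, 1.0) for _ in range(N_MODALITIES)]
       for _ in range(10)]
motion = rnn_classify(seq)
print(motion, "->", PRIMITIVES[motion])
```

In the actual system a trained network (the paper uses recurrent neural networks) would replace the random weights, and the primitive lookup would trigger real robot motions.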
China uses VR eye tracking to gauge success of drug rehab
China's rehab centers are no strangers to using technology to treat addiction. The latest approach, however, is rather unusual. Shanghai drug rehab facilities (not pictured here) are trialing a combination of VR, eye tracking and skin sensors to both aid in recovery and gauge its effectiveness. Recovering addicts have to look at images and video illustrating the effects of drugs, and the eye monitoring can help determine their reactions, including whether or not they're paying attention in the first place. Think of it like a (relatively) gentler version of A Clockwork Orange's Ludovico treatment -- patients can't look away from the unpleasant imagery without their overseers knowing.
World Autism Awareness Day: Humanoid robot with skin sensors to help children in NHS trial
Kaspar, a humanoid robot designed to help children with autism, is set to be trialled by the NHS. The child-sized robot was created by researchers at the University of Hertfordshire and is programmed to respond to touch. Kaspar is designed to play games with children, using a collection of skin sensors placed on various parts of its body to "encourage certain tactile behaviours" and discourage "inappropriate" ones. It will be used to teach five- to ten-year-olds who have recently been diagnosed with Autism Spectrum Disorder (ASD) how to socialise and communicate. This is because research indicates that early intervention increases the likelihood of improved long-term outcomes for children with the condition.
Google Translate Buttons For Health Care Are Coming
Canan Dagdeviren [JAH-naan DAH-day-vee-ren] is head of the new 'Conformable Decoders' research group at MIT. Scientist Canan Dagdeviren is an interpreter for a language without words. She knows our bodies are saying something important, speaking a unique language of their own. It's a lexicon that's completely different from the Turkish and English that Dagdeviren speaks every day, but it's one she believes we need to start translating in earnest. She wants to count up our brain pulses, watch our temperature change in real time and observe how we breathe. This is different from the health-monitoring tech inside consumer Fitbits and smart watches that can count each step and monitor every heartbeat.